Coordinated Universal Time

Coordinated Universal Time (UTC) is the primary time standard by which the world regulates clocks and time. It is one of several closely related successors to Greenwich Mean Time. Computer servers, online services and other entities that rely on a universally accepted time use UTC for that purpose. If only limited precision is needed, clients can obtain the current UTC time from a number of official internet UTC servers; for sub-microsecond precision, clients can obtain the time from satellite signals. Time zones around the world are expressed as positive or negative offsets from UTC.

Coordinated Universal Time is based on International Atomic Time (TAI), a time standard calculated using a weighted average of signals from atomic clocks located in nearly 70 national laboratories around the world.[1] The only difference between the two is that UTC is occasionally adjusted by adding a leap second in order to keep it within one second of UT1, which is defined by the Earth's rotation. Since this adjustment mechanism was introduced in 1972, 24 leap seconds have been added (as of 2011), leaving UTC 34 seconds behind TAI.

The UTC system was officially adopted in 1961 by the International Radio Consultative Committee, after having been initiated by several national time laboratories. It was modified several times over the following years, until leap seconds were adopted in 1972. A number of proposals have been made to replace it with a new system that would eliminate leap seconds, but no consensus has yet been reached.

Uses

UTC is the time standard used for many Internet and World Wide Web standards. The Network Time Protocol, designed to synchronise the clocks of computers over the internet, encodes times using the UTC system.[2]

UTC is also the time standard used in aviation.[3] Weather forecasts, flight plans, air traffic control clearances, and maps all use UTC to avoid confusion about time zones and daylight saving time.

Amateur radio operators often schedule their radio contacts in UTC, because transmissions on some frequencies can be received across many time zones.[4]

Definition and relationship to other standards

The current version of UTC is defined by International Telecommunication Union Recommendation ITU-R TF.460-6, Standard-frequency and time-signal emissions.[5] UTC is based on International Atomic Time (TAI) with leap seconds added at irregular intervals to compensate for the Earth's slowing rotation.[6] Leap seconds are used to allow UTC to closely track Universal Time (UT1), as explained later in this article.

The difference between UTC and UT1 is not allowed to exceed 0.9 seconds, so if high precision is not required, the general term Universal Time (UT) may be used.[7] The term Greenwich Mean Time (GMT) has no precise definition at the subsecond level; in informal or casual contexts it may mean either UTC or UT1. In technical contexts, usage of "GMT" is avoided; the unambiguous terminology "UTC" or "UT1" is preferred.[7]

Notation

Compromise notation

  Source       Initials   Words
  English      CUT        Coordinated Universal Time
  French       TUC        Temps Universel Coordonné
  compromise   UTC        unofficial English: "Universal Time, Coordinated"; unofficial French: "Universel Temps Coordonné"[8][9]

The official notation for Coordinated Universal Time is UTC. This notation arose from a desire by the International Telecommunication Union and the International Astronomical Union to use the same notation in all languages. English speakers originally proposed "CUT" (for "coordinated universal time"), while French speakers proposed "TUC" (for "temps universel coordonné"). The compromise that emerged was UTC,[10] which conforms to the pattern for the notations of the variants of Universal Time (UT0, UT1, UT2, UT1R etc.).[11]

Mechanism

UTC divides time into days, hours, minutes and seconds. Days are conventionally identified using the Gregorian calendar, but Julian day numbers can also be used. Each day contains 24 hours and each hour contains 60 minutes. The number of seconds in a minute is usually 60, but may very rarely be 61 or 59.[12] Thus, in the UTC time scale, the second and all smaller time units (millisecond, microsecond etc.) are of constant duration, but the minute and all larger time units (hour, day, week etc.) are of variable duration. Decisions to introduce a leap second are announced at least eight weeks in advance in "Bulletin C", produced by the International Earth Rotation and Reference Systems Service.[13][14] Leap seconds cannot be predicted far in advance because the Earth's rate of rotation is unpredictable.[15]

Nearly all UTC days contain exactly 86,400 SI seconds, with exactly 60 seconds in each minute. However, because the mean solar day is slightly longer than 86,400 SI seconds, occasionally the last minute of a UTC day is adjusted to have 61 seconds. The extra second is called a leap second. It accounts for the grand total of the extra length (about 2 milliseconds each) of all the mean solar days since the previous leap second. The last minute of a UTC day is permitted to contain 59 seconds to cover the remote possibility of the Earth rotating faster, but that has not yet been necessary since UTC was introduced. The irregular day lengths mean that fractional Julian days do not work properly with UTC.
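The variable day length described above can be sketched in code. This is an illustrative fragment, not a real leap-second service: the table below lists only three of the historical positive leap-second dates, and the function name is hypothetical.

```python
# Sketch: number of SI seconds in a UTC day, given an abbreviated
# (hypothetical) table of dates whose final minute had 61 seconds.
# A real implementation would use the full IERS leap-second history.
from datetime import date

POSITIVE_LEAP_DAYS = {date(1972, 6, 30), date(1972, 12, 31), date(2008, 12, 31)}

def seconds_in_utc_day(d: date) -> int:
    # 86,400 on ordinary days; 86,401 when a positive leap second is
    # appended; 86,399 would apply to a (so far unused) negative leap second.
    return 86_401 if d in POSITIVE_LEAP_DAYS else 86_400

print(seconds_in_utc_day(date(2008, 12, 31)))  # 86401
print(seconds_in_utc_day(date(2009, 1, 1)))    # 86400
```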

Since 1972, UTC has been calculated by subtracting the accumulated leap seconds from International Atomic Time (TAI), which is a coordinate time scale tracking notional proper time on the rotating surface of the Earth (the geoid). In order to maintain a close approximation to UT1 (equivalent to GMT before 1960), UTC occasionally has discontinuities where it changes from one linear function of TAI to another. These discontinuities take the form of leap seconds implemented by a UTC day of irregular length. Discontinuities in UTC have occurred only at the end of a Gregorian month.[16] The International Earth Rotation and Reference Systems Service (IERS) tracks and publishes the difference between UTC and Universal Time, DUT1 = UT1 − UTC, and introduces discontinuities into UTC to keep DUT1 in the interval (−0.9 s, +0.9 s). Since 1972 the discontinuities have consisted only of a leap of one second at the end of 30 June or 31 December.[17]

As with TAI, UTC is only known with the highest precision in retrospect. Users who require an approximation in real time must obtain it from a time laboratory, which disseminates an approximation using techniques such as GPS or radio time signals. Such approximations are designated UTC(k), where k is an abbreviation for the time laboratory.[18] The time of events may be provisionally recorded against one of these approximations; later corrections may be applied using the International Bureau of Weights and Measures (BIPM) monthly publication of tables of differences between canonical TAI/UTC and TAI(k)/UTC(k) as estimated in real time by participating laboratories. (See the article on International Atomic Time for details.)

Because of time dilation, a standard clock not on the geoid, or in rapid motion, will not maintain synchronicity with UTC. Therefore, telemetry from clocks with a known relation to the geoid is used to provide UTC when required at locations such as spacecraft.

UTC is a discontinuous timescale, so it is not possible to compute the exact time interval elapsed between two UTC timestamps without consulting a table that describes how many leap seconds occurred during that interval. Therefore, many scientific applications that require precise measurement of long (multi-year) intervals use TAI instead. TAI is also commonly used by systems that cannot handle leap seconds. GPS time, which is offset from TAI by a fixed 19 seconds, is another such scale.
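The interval problem can be illustrated with a sketch. Naive subtraction of two UTC timestamps ignores leap seconds; adding the count of intervening leap seconds (here, an abbreviated, hypothetical table containing just two of the historical entries) recovers the true elapsed SI seconds, which is what TAI measures directly.

```python
# Sketch: elapsed SI seconds between two UTC instants, using a
# hypothetical, abbreviated leap-second table. A real implementation
# would consult the full IERS history.
from datetime import datetime, timezone

# UTC instants immediately after which a positive leap second occurred
LEAP_INSTANTS = [
    datetime(2005, 12, 31, 23, 59, 59, tzinfo=timezone.utc),
    datetime(2008, 12, 31, 23, 59, 59, tzinfo=timezone.utc),
]

def elapsed_si_seconds(t0: datetime, t1: datetime) -> float:
    naive = (t1 - t0).total_seconds()           # ignores leap seconds
    leaps = sum(1 for t in LEAP_INSTANTS if t0 <= t < t1)
    return naive + leaps

t0 = datetime(2005, 1, 1, tzinfo=timezone.utc)
t1 = datetime(2009, 1, 1, tzinfo=timezone.utc)
print(elapsed_si_seconds(t0, t1) - (t1 - t0).total_seconds())  # 2.0
```

The two extra seconds are the leap seconds inserted at the ends of 2005 and 2008, which the naive subtraction misses.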

For most common and legal-trade purposes, the fractional second difference between UTC and UT (GMT) is inconsequentially small, so UTC is often called GMT (for instance, by the BBC).[19]

Time zones

Time zones are usually defined to differ from UTC by an integral number of hours,[20] although the laws of each jurisdiction would have to be consulted if sub-second accuracy were required. Several jurisdictions have established time zones that differ by an integer number of half-hours or quarter-hours from UT1 or UTC.
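Such fractional offsets can be sketched with fixed-offset zones. The +05:30 and +05:45 offsets below are illustrative examples of half-hour and quarter-hour zones (as used by, for instance, India and Nepal standard time); the zone names are labels of convenience.

```python
# Sketch: rendering one UTC instant in fixed half-hour and
# quarter-hour offset zones.
from datetime import datetime, timedelta, timezone

utc_noon = datetime(2011, 6, 1, 12, 0, tzinfo=timezone.utc)

half_hour = timezone(timedelta(hours=5, minutes=30), "UTC+05:30")
quarter_hour = timezone(timedelta(hours=5, minutes=45), "UTC+05:45")

print(utc_noon.astimezone(half_hour).strftime("%H:%M"))     # 17:30
print(utc_noon.astimezone(quarter_hour).strftime("%H:%M"))  # 17:45
```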

The UTC time zone is sometimes denoted by the letter Z—a reference to the equivalent nautical time zone (GMT), which has been denoted by a Z since about 1950. The letter also refers to the "zone description" of zero hours, which has been used since 1920 (see time zone history). Since the NATO phonetic alphabet and amateur radio word for Z is "Zulu", UTC is sometimes known as Zulu time. This is especially true in aviation, where Zulu is the universal standard.[21] This ensures all pilots regardless of location are using the same 24-hour clock, thus avoiding confusion when flying between time zones.[22] See list of military time zones for letters used in addition to Z in qualifying time zones other than Greenwich.

On electronic devices that only allow the current time zone to be configured using maps or city names, UTC can be selected indirectly by selecting Reykjavík, Iceland, which is always on UTC time and does not use daylight saving.[23]

Daylight saving

UTC does not change with a change of seasons, but local time or civil time may change if a time zone jurisdiction observes daylight saving time or summer time. For example, UTC is 5 hours ahead of (that is, later than) local time on the east coast of the United States during winter, but 4 hours ahead while daylight saving is observed.[24]
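The east-coast example above can be sketched as follows. For simplicity the winter and summer offsets are modelled as two fixed zones; a full DST-aware time zone database (such as the IANA tz data) would switch between them automatically.

```python
# Sketch: the same UTC wall-clock hour rendered with the US east
# coast's winter (UTC-5) and summer (UTC-4) offsets.
from datetime import datetime, timedelta, timezone

EST = timezone(timedelta(hours=-5), "EST")  # winter offset
EDT = timezone(timedelta(hours=-4), "EDT")  # daylight-saving offset

winter = datetime(2011, 1, 15, 12, 0, tzinfo=timezone.utc)
summer = datetime(2011, 7, 15, 12, 0, tzinfo=timezone.utc)

print(winter.astimezone(EST).hour)  # 7: UTC is 5 hours ahead in winter
print(summer.astimezone(EDT).hour)  # 8: only 4 hours ahead under daylight saving
```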

History

At the 1884 International Meridian Conference held in Washington, D.C., the local mean solar time at the Royal Observatory, Greenwich in England was chosen to define the Universal day, counted from 0 hours at mean midnight. This agreed with civil Greenwich Mean Time (GMT), used on the island of Great Britain since 1847. In contrast, astronomical GMT began at mean noon, 12 hours after the mean midnight of the same date, a convention that lasted until 1 January 1925. Nautical GMT began at mean noon, 12 hours before the mean midnight of the same date; the British Navy abandoned this convention by about 1805, but it persisted much later elsewhere and was mentioned at the 1884 conference. In 1884, the Greenwich Meridian was used as the Prime Meridian on two-thirds of all charts and maps.[25] In 1928, the International Astronomical Union introduced the term Universal Time (UT) to refer to GMT with the day starting at midnight.[26] Until the 1950s, broadcast time signals were based on UT, and hence on the rotation of the Earth.

In 1955, the caesium atomic clock was invented. This provided a form of timekeeping that was both more stable and more convenient than astronomical observations. In 1956, the U.S. National Bureau of Standards and U.S. Naval Observatory started to develop atomic frequency time scales; by 1959 these time scales were used in generating the WWV time signals, named for the shortwave radio station that broadcasts them. In 1960 the U.S. Naval Observatory, the Royal Greenwich Observatory, and the U.K. National Physical Laboratory coordinated their radio broadcasts so time steps and frequency changes were coordinated, and the resulting time scale was informally referred to as "Coordinated Universal Time".[27]

In a controversial decision, the frequency of the signals was initially set to match the rate of UT, but then kept at the same frequency by the use of atomic clocks and deliberately allowed to drift away from UT. When the divergence grew significantly, the signal was phase shifted (stepped) by 20 ms to bring it back into agreement with UT. Twenty-nine such steps were used before 1960.[28] The signal frequency was changed less often.

In 1958, data was published linking the frequency for the caesium transition, newly established, with the ephemeris second. The ephemeris second is the duration of time that, when used as the independent variable in the laws of motion governing the movement of the planets and moons in the solar system, causes those laws to accurately predict the observed positions of solar system bodies. Within the limits of observing accuracy, ephemeris seconds are of constant length, as are atomic seconds. This publication allowed a value to be chosen for the length of the atomic second that would work properly with the celestial laws of motion.[29]

UTC was officially initiated at the start of 1961 (but the name Coordinated Universal Time was not adopted by the International Astronomical Union until 1967).[30][31] The TAI instant 1 January 1961 00:00:01.422818 exactly was identified as UTC instant 1 January 1961 00:00:00.000000 exactly, and UTC ticked exactly one second for every 1.000000015 s of TAI. Time steps occurred every few months thereafter, and frequency changes at the end of each year. The jumps increased in size to 100 ms, with only one 50 ms jump having ever occurred. This UTC was intended to permit a very close approximation of UT2, within around 0.1 s.
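The 1961 rate offset quoted above (one UTC second per 1.000000015 s of TAI) implies an annual drift that can be checked with simple arithmetic. This is an illustrative back-of-envelope calculation, not a figure from the source.

```python
# Rough check: a rate offset of 15 ns per second accumulated over a
# year, which the step adjustments then compensated.
rate = 1.000000015                  # TAI seconds per UTC second, 1961
seconds_per_year = 365.25 * 86_400
drift = (rate - 1) * seconds_per_year   # UTC's annual lag behind TAI
print(f"{drift * 1000:.0f} ms per year")
```

At roughly half a second per year, this rate kept the broadcast time scale close to UT2 between the periodic time steps.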

In 1967, the SI second was redefined in terms of the frequency supplied by a caesium atomic clock. The length of the second so defined was practically equal to the second of ephemeris time.[32] This was the frequency that had been provisionally used in TAI since 1958. It was soon recognised that having two types of second with different lengths, namely the UTC second and the SI second used in TAI, was a bad idea. It was thought that it would be better for time signals to maintain a consistent frequency, and that that frequency should match the SI second. Thus it would be necessary to rely on time steps alone to maintain the approximation of UT. This was tried experimentally in a service known as "Stepped Atomic Time" (SAT), which ticked at the same rate as TAI and used jumps of 200 ms to stay synchronised with UT2.[33]

There was also dissatisfaction with the frequent jumps in UTC (and SAT). In 1968, Louis Essen, the inventor of the caesium atomic clock, and G. M. R. Winkler both independently proposed that steps should be of 1 s only.[34] This system was eventually approved, along with the idea of maintaining the UTC second equal to the TAI second. At the end of 1971, there was a final irregular jump of exactly 0.107758 TAI seconds, so that 1 January 1972 00:00:00 UTC was 1 January 1972 00:00:10 TAI exactly, making the difference between UTC and TAI an integer number of seconds. At the same time, the tick rate of UTC was changed to exactly match TAI. UTC also started to track UT1 rather than UT2. Some time signals started to broadcast the DUT1 correction (UT1 − UTC) for applications requiring a closer approximation of UT1 than UTC now provided.[35][36]

The first leap second occurred on 30 June 1972. Since then, leap seconds have occurred on average about once every 19 months, always on 30 June or 31 December. As of 1 January 2010, 00:00:00 UTC, there have been 24 leap seconds in total, all positive, putting UTC 34 seconds behind TAI (this is the case since 1 January 2009).[37]

Rationale

The Earth's rotational speed is very slowly decreasing due to tidal deceleration, causing the mean solar day to increase in length. The length of the SI second was calibrated on the basis of the second of ephemeris time[29][32] and can now be seen to have a relationship with the mean solar day observed between 1750 and 1892, analysed by Simon Newcomb. As a result, the SI second is close to 1/86,400 of a mean solar day as observed around 1820. In earlier centuries the mean solar day was shorter than 86,400 SI seconds, and in more recent centuries it has been longer than 86,400 SI seconds. At the end of the 20th century the length of the mean solar day (also known simply as "length of day" or "LOD") was approximately 86,400.002 s. For this reason, UT is now "slower" than TAI.

The excess of the LOD over the nominal 86,400 s accumulates over time, causing the UTC day, initially synchronised with the mean sun, to become desynchronised and run ahead of it. At the end of the 20th century, with the LOD at 2 ms above the nominal value, UTC ran faster than UT by 2 ms per day, getting a second ahead roughly every 500 days. Thus, leap seconds were inserted at approximately this interval, retarding UTC to keep it synchronised in the long term. Note that the actual rotational period varies with unpredictable factors such as tectonic motion, and must be observed rather than computed.

The insertion of a leap second every 500 days does not indicate that the mean solar day is getting longer by a second every 500 days (just as a leap day every four years does not mean the year is getting longer by one day every four years): it will take approximately 50,000 years for a mean solar day to lengthen by one second (at a rate of 2 ms/cy). This is a mean rate within the range of 1.7–2.3 ms/cy. The rate due to tidal friction alone is about 2.3 ms/cy, but the uplift of Canada and Scandinavia by several metres since the last Ice Age has temporarily reduced this to 1.7 ms/cy over the last 2700 years.[38] The correct reason for leap seconds is not the current difference between actual and nominal LOD, but rather the accumulation of this difference over a period of time: in the late twentieth century, this difference was about 1/500 of a second per day, so it accumulated to 1 second after about 500 days.

For example, assume you start counting the seconds from the Unix epoch of 1970-01-01T00:00:00 UTC with an atomic clock. At midnight on that day (as measured on UTC), your counter registers 0 s. After Earth has made one full rotation with respect to the mean Sun, your counter will register approximately 86400.002 s (the precise value will vary depending on plate tectonic conditions). Based on your counter, you can calculate that the date is 1970-01-02T00:00:00 UT1. After 500 rotations, your counter will register 43,200,001 s. Since 86,400 s × 500 is 43,200,000 s, you will calculate that the date is 1971-05-16T00:00:01 UTC, while it is only 1971-05-16T00:00:00 UT1. If you had added a leap second on December 31, 1970, retarding your counter by 1 s, then the counter would have a value of 43,200,000 s at 1971-05-16T00:00:00 UT1, and allow you to calculate the correct date.
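The worked example above reduces to simple arithmetic, sketched here with the late-20th-century LOD value quoted in the article:

```python
# Sketch of the counter example: an atomic clock counting SI seconds
# while the Earth completes rotations of 86,400.002 s each.
LOD = 86_400.002     # mean solar day length, late 20th century
rotations = 500

atomic = rotations * LOD        # seconds counted by the atomic clock
calendar = rotations * 86_400   # seconds assumed by a leap-free calendar

print(atomic - calendar)        # ~1 s: the clock reads a second ahead of UT1
```

Retarding the counter by one leap second during the 500 days would cancel exactly this accumulated difference.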

In a graph of DUT1 against time, the excess of LOD above the nominal 86,400 s corresponds to the downward slope between vertical segments. (The slope became shallower in the 2000s, due to a slight acceleration of the Earth's crust temporarily shortening the day.) Vertical position corresponds to the accumulation of this difference over time, and the vertical segments correspond to leap seconds introduced to match this accumulated difference. Leap seconds are timed to keep DUT1 within its permitted range. The frequency of leap seconds therefore corresponds to the slope of the diagonal segments, and thus to the excess LOD.

Future

As the Earth's rotation continues to slow, positive leap seconds will be required more frequently. The long-term rate of change of LOD is approximately +1.7 ms per century. At the end of the 21st century LOD will be roughly 86,400.004 s, requiring leap seconds every 250 days. Over several centuries, the frequency of leap seconds will become problematic.
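The extrapolation above follows directly from the excess day length; the figures below are those quoted in this section, and the function name is illustrative.

```python
# Sketch: days for a one-second divergence between UTC and UT1 to
# accumulate, given a mean solar day (LOD) in SI seconds.
def days_per_leap_second(lod: float) -> float:
    return 1 / (lod - 86_400)

print(round(days_per_leap_second(86_400.002)))  # ~500: late 20th century
print(round(days_per_leap_second(86_400.004)))  # ~250: projected end of 21st century
```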

Some time in the 22nd century, two leap seconds will be required every year. The current use of only the leap second opportunities in June and December will be insufficient, and the March and September options will have to be used. In the 25th century, four leap seconds will be required every year, so the current quarterly options will be insufficient. Thereafter there will need to be the possibility of leap seconds at the end of any month. In about two thousand years even that will become insufficient, and there will have to be leap seconds that are not at the end of a month.[39]

In a few tens of thousands of years (the timing is uncertain) LOD will exceed 86,401 s, causing the current form of UTC to break down due to requiring more than one leap second per day. It would be possible to then continue with double leaps, but this becomes increasingly untenable.

Both the one-leap-second-per-month[40] and one-leap-second-per-day milestones are considered (by different theorists) to mark the theoretical limit of the applicability of UTC. The actual number of leap seconds to keep track of time would become unwieldy by current standards well before these, but presumably if UTC were to continue then horological systems would be redesigned to cope with regular leap seconds much better than current systems do.

There is a proposal to redefine UTC and abolish leap seconds, such that sundials would slowly get further out of sync with civil time.[41] The resulting gradual shift of the sun's movements relative to civil time is analogous to the shift of seasons relative to the yearly calendar that results from the calendar year not precisely matching the tropical year length. This would be a major practical change in civil timekeeping, but would take effect slowly over several centuries. ITU-R Study Group 7 and Working Party 7A were unable to reach consensus on whether to advance the proposal to the 2012 Radiocommunications Assembly (scheduled for 16–20 January in Geneva); the chairman of Study Group 7 elected to advance the question to the Assembly regardless.[42]

There is also a proposal that the present form of UTC could be improved to track UT1 more closely, by allowing greater freedom in scheduling leap seconds.[43]

Notes

  1. ^ International Bureau of Weights and Measures 2011.
  2. ^ How NTP Works 2011.
  3. ^ Aviation Time 2006.
  4. ^ Horzepa 2010.
  5. ^ ITU Radiocommunication Assembly 2002.
  6. ^ Time Service Dept. c. 2009.
  7. ^ a b Universal Time n.d.
  8. ^ BelleSerene. "French time: "heure légale"". Yachting and Boating World Forums. http://www.ybw.com/forums/archive/index.php/t-202760.html. Retrieved 5 August 2011. 
  9. ^ "Chat Rooms". CIRCLIST. http://www.circlist.info/chatrooms.html. Retrieved 5 August 2011. 
  10. ^ National Institute of Standards and Technology 2011.
  11. ^ IAU resolutions 1976.
  12. ^ ITU Radiocommunication Assembly 2002, p. 3.
  13. ^ International Earth Rotation and Reference Systems Service 2011.
  14. ^ McCarthy & Seidelmann 2009, p. 229.
  15. ^ McCarthy & Seidelmann 2009, chapter 4.
  16. ^ History of TAI-UTC c. 2009.
  17. ^ McCarthy & Seidelmann 2009, pp. 217, 227–231.
  18. ^ McCarthy & Seidelmann 2009, p. 209.
  19. ^ Langley 1999.
  20. ^ Seidelmann 1992, p. 7.
  21. ^ Military & Civilian Time Designations n.d.
  22. ^ Williams 2005.
  23. ^ Iceland 2011.
  24. ^ Standard time 2010.
  25. ^ Howse 1997, pp. 133–137.
  26. ^ McCarthy & Seidelmann 2009, pp. 10–11.
  27. ^ McCarthy & Seidelmann 2009, pp. 226–227.
  28. ^ Arias, Guinot & Quinn 2003.
  29. ^ a b Markowitz et al. 1958.
  30. ^ Nelson & McCarthy 2005, p. 15.
  31. ^ Nelson & McCarthy 2001, p. 515.
  32. ^ a b Markowitz 1988.
  33. ^ McCarthy & Seidelmann 2009, p. 227.
  34. ^ Essen 1968, pp. 161–5.
  35. ^ Seidelmann 1992, pp. 85–87.
  36. ^ Nelson, Lombardi & Okayama 2005, p. 46.
  37. ^ Bulletin C 2011.
  38. ^ Stephenson & Morrison 1995.
  39. ^ Allen 2011a.
  40. ^ Finkleman et al. 2011.
  41. ^ Allen 2011b.
  42. ^ Seidelmann & Seago 2011, p. S190.
  43. ^ Seaman 2003.
